An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization
Author
Abstract
In this paper we suggest a new conjugate gradient algorithm in which, for all k ≥ 0, both the descent and the conjugacy conditions are guaranteed. The search direction is selected as a linear combination of −g_{k+1} and s_k, where g_{k+1} = ∇f(x_{k+1}) and s_k = x_{k+1} − x_k, and the coefficients in this linear combination are chosen in such a way that both the descent and the conjugacy conditions are satisfied at every iteration. It is shown that for general nonlinear functions with bounded Hessian the algorithm with strong Wolfe line search generates directions bounded away from infinity. The algorithm uses an acceleration scheme that modifies the steplength α_k in such a manner as to improve the reduction of the function values along the iterations. Numerical comparisons with some conjugate gradient algorithms on a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that the computational scheme outperforms known conjugate gradient algorithms such as Hestenes–Stiefel, Polak–Ribière–Polyak, Dai–Yuan and hybrid Dai–Yuan, as well as CG_DESCENT by Hager and Zhang with Wolfe line search.
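As an illustration of the general idea (a minimal sketch, not the paper's actual coefficient formulas), one can pick the combination d_{k+1} = −g_{k+1} + β_k s_k with β_k determined by the pure conjugacy condition y_k^T d_{k+1} = 0, falling back to steepest descent whenever the descent condition g_{k+1}^T d_{k+1} < 0 fails:

```python
import numpy as np

def direction(g_new, s, y, eps=1e-12):
    """Sketch: d = -g_new + beta*s with beta chosen so that the
    conjugacy condition y^T d = 0 holds exactly; if the resulting d
    is not a descent direction (g_new^T d >= 0), fall back to the
    steepest-descent direction -g_new.  This illustrates the idea of
    enforcing both conditions, not the coefficient choice used in
    the paper."""
    ys = float(y @ s)
    if abs(ys) < eps:          # near-zero curvature along s: restart
        return -g_new
    beta = float(y @ g_new) / ys   # solves y^T(-g_new + beta*s) = 0
    d = -g_new + beta * s
    if float(g_new @ d) >= 0.0:    # descent condition violated
        d = -g_new
    return d
```

For a convex quadratic f(x) = x^T A x / 2 one has y_k = A s_k, so y_k^T d_{k+1} = 0 is exactly A-conjugacy of d_{k+1} and s_k, the classical CG condition.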
Related articles
A Note on the Descent Property Theorem for the Hybrid Conjugate Gradient Algorithm CCOMB Proposed by Andrei
In [1] (Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization, J. Optim. Theory Appl. 141 (2009) 249-264), an efficient hybrid conjugate gradient algorithm, the CCOMB algorithm, is proposed for solving unconstrained optimization problems. However, the proof of Theorem 2.1 in [1] is incorrect due to an erroneous inequality which was used to establish the descent property for the s...
Another accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization
In this paper we suggest another accelerated conjugate gradient algorithm in which, for all k ≥ 0, both the descent and the conjugacy conditions are guaranteed. The search direction is selected as d_{k+1} = −θ_k g_{k+1} + [(y_k^T g_{k+1})/(y_k^T s_k) − t_k (s_k^T g_{k+1})/(y_k^T s_k)] s_k, where g_{k+1} = ∇f(x_{k+1}) and s_k = x_{k+1} − x_k. The coefficients θ_k and t_k in this linear combinat...
Another Conjugate Gradient Algorithm with Guaranteed Descent and Conjugacy Conditions for Large-scale Unconstrained Optimization
In this paper we suggest another accelerated conjugate gradient algorithm in which, for all k ≥ 0, both the descent and the conjugacy conditions are guaranteed. The search direction is selected as d_{k+1} = −θ_k g_{k+1} + [(y_k^T g_{k+1})/(y_k^T s_k) − t_k (s_k^T g_{k+1})/(y_k^T s_k)] s_k, where g_{k+1} = ∇f(x_{k+1}) and s_k = x_{k+1} − x_k. The coefficients θ_k and t_k in this linear combinat...
A new hybrid conjugate gradient algorithm for unconstrained optimization
In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. This new method generates sufficient descent directions independently of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
An eigenvalue study on the sufficient descent property of a modified Polak-Ribière-Polyak conjugate gradient method
Based on an eigenvalue analysis, a new proof for the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.
Journal: Optimization Methods and Software
Volume 27
Pages: -
Publication date: 2012